An overview of differential item functioning in multistage computer adaptive testing using three-parameter logistic item response theory

Authors

  • Karim Sadeghi
  • Zainab Abolfazli Khonbi
Abstract

As aptly summarised by Ida Lawrence, “Testing is growing by leaps and bounds across the world. There is a realization that a nation’s well-being depends crucially on the educational achievement of its population. Valid tests are an essential tool to evaluate a nation’s educational standing and to implement efficacious educational reforms. Because tests consume time that otherwise could be devoted to instruction, it is important to devise tests that are efficient. Doing so requires a careful balancing of the contributions of technology, psychometrics, test design, and the learning sciences. Computer adaptive multistage testing (MSCAT) fits the bill extraordinarily well; unlike other forms of adaptive testing, it can be adapted to educational surveys and student testing. Research in this area will be an evidence that the methodologies and underlying technology that surround MSCAT have reached maturity and that there is a growing acceptance by the field of this type of test design” (from the Foreword to D. Yan, A. A. von Davier, & C. Lewis (Eds.), Computerized multistage testing: Theory and applications). This state-of-the-art paper presents an overview of differential item functioning (DIF) in MSCAT using three-parameter logistic item response theory (IRT), offering suggestions for implementing it in practice, in the hope of motivating testing and assessment researchers and practitioners to initiate projects in this under-practiced area by helping them better understand the relevant technical concepts.
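As a minimal sketch of the three-parameter logistic model the paper builds on, the item response function below gives the probability of a correct response as a function of examinee ability; the parameter values used here are purely illustrative, and the side-by-side comparison merely hints at how uniform DIF surfaces when the same item calibrates differently across two groups:

```python
import math

def p_correct(theta, a, b, c):
    """Three-parameter logistic (3PL) IRT item response function.

    theta: examinee ability
    a: item discrimination
    b: item difficulty
    c: pseudo-guessing (lower asymptote)
    """
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical item calibrated separately in two groups. At theta = b
# the 3PL curve sits exactly halfway between c and 1, so the reference
# value is 0.6 here.
reference = p_correct(0.0, a=1.2, b=0.0, c=0.2)  # reference-group calibration
focal = p_correct(0.0, a=1.2, b=0.5, c=0.2)      # focal group: higher difficulty

# A lower probability for the focal group at the SAME ability level is
# the signature of uniform DIF against that group on this item.
print(reference, focal)
```

Real DIF analyses would of course estimate these parameters from response data and test the between-group difference statistically; the snippet only illustrates the mechanics of the curve itself.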


Similar articles

Differential Item Functioning (DIF) in Terms of Gender in the Reading Comprehension Subtest of a High-Stakes Test

Validation is an important enterprise, especially when a test is a high-stakes one. Demographic variables like gender and field of study can affect test results and interpretations. Differential Item Functioning (DIF) is a way to make sure that a test does not favor one group of test takers over the others. This study investigated DIF in terms of gender in the reading comprehension subtest (35 i...

Full text

A confirmatory study of Differential Item Functioning on EFL reading comprehension

The present study aimed at investigating DIF sources on an EFL reading comprehension test. Accordingly, 2 DIF detection methods, logistic regression (LR) and item response theory (IRT), were used to flag emergent DIF of 203 (110 females & 93 males) Iranian EFL examinees’ performance on a reading comprehension test. Seven hypothetical DIF sources were examin...

Full text

Selecting the Best Fit Model in Cognitive Diagnostic Assessment: Differential Item Functioning Detection in the Reading Comprehension of the PhD Nationwide Admission Test

This study was an attempt to provide detailed information on the strengths and weaknesses of test takers’ real ability through cognitive diagnostic assessment, and to detect differential item functioning in each test item. The rationale for using CDA was that it estimates an item’s discrimination power, whereas classical test theory or item response theory depicts between rather than within item mu...

Full text

Using Multiple-Variable Matching to Identify EFL Ecological Sources of Differential Item Functioning

Context is a vague notion with numerous building blocks, making inferences from language test scores quite convoluted. This study has made use of a model of item responding that has striven to theorize the contextual infrastructure of differential item functioning (DIF) research and help specify the sources of DIF. Two steps were taken in this research: first, to identify DIF by gender grouping via l...

Full text

Adaptive Mastery Testing Using the Rasch Model and Bayesian Sequential Decision Theory

A version of sequential mastery testing is studied in which response behavior is modeled by an item response theory (IRT) model. First, a general theoretical framework is sketched that is based on a combination of Bayesian sequential decision theory and item response theory. A discussion follows on how IRT based sequential mastery testing can be generalized to adaptive item and testlet selectio...

Full text


Journal:

Volume   Issue 

Pages  -

Publication year: 2017